Simultaneous Estimation of Nongaussian Components and Their Correlation Structure
Abstract
The statistical dependencies that independent component analysis (ICA) cannot remove often provide rich information beyond the linear independent components. It would thus be very useful to estimate the dependency structure from data. While such models have been proposed, they have usually concentrated on higher-order correlations such as energy (square) correlations. Yet linear correlations are a fundamental and informative form of dependency in many real data sets. Linear correlations are usually completely removed by ICA and related methods so they can only be analyzed by developing new methods that explicitly allow for linearly correlated components. In this article, we propose a probabilistic model of linear nongaussian components that are allowed to have both linear and energy correlations. The precision matrix of the linear components is assumed to be randomly generated by a higher-order process and explicitly parameterized by a parameter matrix. The estimation of the parameter matrix is shown to be particularly simple because using score matching (Hyvärinen, 2005), the objective function is a quadratic form. Using simulations with artificial data, we demonstrate that the proposed method improves the identifiability of nongaussian components by simultaneously learning their correlation structure. Applications on simulated complex cells with natural image input, as well as spectrograms of natural audio data, show that the method finds new kinds of dependencies between the components.
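To make concrete why score matching leads to an objective that is a quadratic form in the parameters, the following NumPy sketch works through the simplest special case: a zero-mean Gaussian density parameterized directly by its precision matrix. This is only an illustration of the quadratic-objective mechanism, not the article's model, which additionally involves nongaussian components and a higher-order process generating the precision matrix; all names in the code (score_matching_objective, C_hat, M_star) are illustrative assumptions.

```python
# Minimal sketch, assuming a toy model log p(x; M) = -0.5 x'Mx + const
# with symmetric positive-definite precision matrix M. It illustrates
# that the score-matching objective is quadratic in M; it is NOT the
# full model of the article.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x ~ N(0, C_true) with a known covariance.
d, n = 4, 20000
A = rng.normal(size=(d, d))
C_true = A @ A.T + d * np.eye(d)
x = rng.multivariate_normal(np.zeros(d), C_true, size=n)   # shape (n, d)

def score_matching_objective(M, x):
    """Empirical score-matching objective for log p(x; M) = -0.5 x'Mx + const.

    Score: psi(x) = -M x, so d psi_i / d x_i = -M_ii, and
    J(M) = E[ sum_i d psi_i/d x_i + 0.5 * psi_i(x)^2 ]
         = -tr(M) + 0.5 * E[x' M^2 x]
         = -tr(M) + 0.5 * tr(M C M),  a quadratic form in M.
    """
    C_hat = x.T @ x / len(x)
    return -np.trace(M) + 0.5 * np.trace(M @ C_hat @ M)

# Because the objective is quadratic, its gradient is linear in M:
# dJ/dM = -I + 0.5 (M C_hat + C_hat M) = 0, solved by M* = inv(C_hat).
C_hat = x.T @ x / len(x)
M_star = np.linalg.inv(C_hat)

# Sanity check: the closed-form minimizer beats a perturbed matrix.
perturbed = M_star + 0.1 * np.eye(d)
print(score_matching_objective(M_star, x) < score_matching_objective(perturbed, x))  # True
```

In this toy case the quadratic objective can be minimized in closed form; the point made in the abstract is that the objective for the article's parameter matrix is likewise a quadratic form, which keeps its estimation simple.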
Similar Resources
Sparse Code Shrinkage: Denoising of Nongaussian Data by Maximum Likelihood Estimation
Sparse coding is a method for finding a representation of data in which each of the components of the representation is only rarely significantly active. Such a representation is closely related to redundancy reduction and independent component analysis, and has some neurophysiological plausibility. In this article, we show how sparse coding can be used for denoising. Using maximum likelihood e...
Common and Cluster-Specific Simultaneous Component Analysis
In many fields of research, so-called 'multiblock' data are collected, i.e., data containing multivariate observations that are nested within higher-level research units (e.g., inhabitants of different countries). Each higher-level unit (e.g., country) then corresponds to a 'data block'. For such data, it may be interesting to investigate the extent to which the correlation structure of the var...
Denoising of Sensory Data by Maximum Likelihood Estimation of Sparse Components
Sparse coding is a method for finding a representation of data in which each of the components of the representation is only rarely significantly active. Such a representation is closely related to redundancy reduction and independent component analysis, and has some neurophysiological plausibility. In this paper, we show how sparse coding can be used for denoising. Using maximum likelihood estima...
Denoising of Nongaussian Data by Independent Component Analysis and Sparse Coding
Sparse coding is a method for finding a representation of data in which each of the components of the representation is only rarely significantly active. Such a representation is closely related to redundancy reduction and independent component analysis, and has some neurophysiological plausibility. In this paper, we show how sparse coding can be used for denoising. Using maximum likelihood estima...
Non-Gaussian Source-Filter and Independent Components Generalizations of Spectral Flatness Measure
Spectral Flatness Measure is a well-known method for quantifying the amount of randomness (or “stochasticity”) that is present in a signal. This measure has been widely used in signal compression, audio characterization and retrieval. In this paper we present an information-theoretic generalization of this measure that is formulated in terms of a rate of growth of multi-information of a nonGaus...
Journal: Neural Computation
Volume 29, Issue 11
Pages: -
Publication date: 2017